Flume and Kafka example (Kafka as Flume sink: output to a Kafka topic)

To prepare:
$ sudo mkdir -p /flume/web_spooldir
$ sudo chmod a+w -R /flume

Edit a Flume configuration file:
$ cat /home/tester/flafka/spooldir_kafka.conf
# Name the components in this agent
agent1.sources = weblogsrc
agent1.sinks = kafka-sink
agent1.channels = memchannel
# Configure the source
agent1.s
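The configuration excerpt above is cut off. A minimal sketch of a complete spooldir-to-Kafka agent, reusing the component names from the excerpt; the topic name, broker list, batch and channel sizes are assumptions, and the property names follow the Flume 1.6-era Kafka sink:

# spooldir_kafka.conf (sketch)
agent1.sources = weblogsrc
agent1.sinks = kafka-sink
agent1.channels = memchannel

# Source: watch a spooling directory for new files
agent1.sources.weblogsrc.type = spooldir
agent1.sources.weblogsrc.spoolDir = /flume/web_spooldir
agent1.sources.weblogsrc.channels = memchannel

# Sink: publish each event to a Kafka topic
agent1.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
agent1.sinks.kafka-sink.topic = weblogs
agent1.sinks.kafka-sink.brokerList = localhost:9092
agent1.sinks.kafka-sink.batchSize = 20
agent1.sinks.kafka-sink.channel = memchannel

# Channel: in-memory buffering between source and sink
agent1.channels.memchannel.type = memory
agent1.channels.memchannel.capacity = 10000
agent1.channels.memchannel.transactionCapacity = 1000

Such an agent would then be started with something like: flume-ng agent --conf-file spooldir_kafka.conf --name agent1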
From the Kafka download page, download version 0.8 and unpack it.
1. In server.properties under the config directory, set host.name to the IP of the machine. If Kafka and the Kafka example you are developing run on the same machine, no change is needed; the default localhost also works.
2. Modify the dataDir pr
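For reference, a sketch of the settings these steps appear to touch; the IP is a placeholder, and the dataDir line is assumed to refer to the zookeeper.properties file that ships with Kafka:

# config/server.properties (placeholder IP; localhost works when everything runs on one machine)
host.name=192.168.1.100

# config/zookeeper.properties (assumed target of the dataDir step)
dataDir=/tmp/zookeeper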
Kafka Consumer API example 1: auto-commit offset. Reference: http://blog.csdn.net/xianzhen376/article/details/51167333

Properties props = new Properties();
/* Address of the Kafka service; not all brokers need to be listed */
props.put("bootstrap.servers", "localhost:9092");
/* Specify the consumer group */
props.put("group.id", "test");
/* Whether to automatically confirm t
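The excerpt above is cut off. A minimal sketch of an auto-commit consumer along the same lines; the topic name and the String deserializers are assumptions, not from the original:

import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AutoCommitConsumerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Address of the Kafka service; not every broker has to be listed
        props.put("bootstrap.servers", "localhost:9092");
        // Consumer group
        props.put("group.id", "test");
        // Automatically commit consumed offsets back to Kafka
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("my-topic")); // topic name is an assumption
        while (true) {
            // poll(long) matches the 0.9/0.10-era client the article appears to use
            ConsumerRecords<String, String> records = consumer.poll(100);
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset = %d, key = %s, value = %s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}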
... Kafka

Create a new topic:
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic my-topic

View the list of existing topics:
bin/kafka-topics.sh --list --zookeeper localhost:2181

View the status of a specific topic:
bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic my-topic

Start a consumer to read messages and print them to standard output:
bin/
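(The truncated last command is presumably the console consumer; in 0.8/0.9 it takes the form: bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic my-topic --from-beginning)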
("message"). ToString (). Contains ("A")) println ("Find A in message:" +map.tostring ())}}classRulefilelistenerbextendsStreaminglistener {override Def onbatchstarted (batchstarted: org.apache.spark.streaming.scheduler.StreamingListenerBatchStarted) {println ("-------------------------------------------------------------------------------------------------------------- -------------------------------") println ("Check whether the file's modified date is change, if change then reload the configu
) conf directory, then copy zoo_sample.cfg to zoo.cfg
4) In zoo.cfg, set dataDir=D:\zookeeper-3.3.6\zookeeper-3.3.6\data (adjust according to your extraction path)
3. Start ZooKeeper: go to the bin directory and run zkServer.cmd. Open a command window in the bin directory (Shift + right mouse button), type zkServer.cmd and press Enter.
4. Kafka configuration. Extract the compressed package to D:\kafka_2.11-0.11.0.1, go to the config directory, and edit
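For reference, a minimal zoo.cfg along these lines; the values other than dataDir are the zoo_sample.cfg defaults:

tickTime=2000
initLimit=10
syncLimit=5
clientPort=2181
dataDir=D:\zookeeper-3.3.6\zookeeper-3.3.6\data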
Kafka is message middleware for passing messages between systems, and messages can be persisted. It can be viewed as a queue model, but also as a producer/consumer model. A simple producer client looks like this:

package com.pt.util.kafka;

import java.util.Date;
import java.util.Properties;
import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class MyProducer {
    publi
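The producer class above is truncated. A minimal runnable sketch using the same legacy kafka.javaapi.producer API; the broker address, topic name, and message payload are assumptions, not from the original:

package com.pt.util.kafka;

import java.util.Date;
import java.util.Properties;
import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class MyProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker list and serializer for the legacy (0.8-era) producer
        props.put("metadata.broker.list", "localhost:9092");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "1");

        Producer<String, String> producer = new Producer<>(new ProducerConfig(props));
        // Send one timestamped message to the topic (topic name is an assumption)
        String message = "hello kafka " + new Date();
        producer.send(new KeyedMessage<String, String>("my-topic", message));
        producer.close();
    }
}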
storm.kafka.trident.TridentKafkaEmitter.emitNewPartitionBatch(TridentKafkaEmitter.java:79)
at storm.kafka.trident.TridentKafkaEmitter.access$000(TridentKafkaEmitter.java:
at storm.kafka.trident.TridentKafkaEmitter$1.emitPartitionBatch(TridentKafkaEmitter.java:204)
at storm.kafka.trident.TridentKafkaEmitter$1.emitPartitionBatch(TridentKafkaEmitter.java:194)
at storm.trident.spout.OpaquePartitionedTridentSpoutExecutor$Emitter.emitBatch(OpaquePartitionedTridentSpoutExecutor.java:127)
at storm.
In the previous "OpenSSL and keystore instruction small set," said the recent study of SSL encryption, will give a small example of Java. Copying a piece of code that can run to production is very irresponsible, but a small example can lead us to a quick glimpse of the nature of things. Rome was not built in a day.This article will give a small
Taking the FortiGate 60B under the V3.0 system as an example, this illustrates how to configure SSL VPN. All FortiGate firewall devices running FortiOS V3.0 (regardless of model) can use this example as a reference.
Begin:
Firewall → Address → New Address
Virtual Private Network → SSL → Settings
Address pool for